
    On the complexity of Jensen's algorithm for counting fixed polyominoes

    Recently I. Jensen published a novel transfer-matrix algorithm for computing the number of polyominoes in a rectangular lattice. However, his estimate of the computational complexity of the algorithm, O((√2)^n), where n is the size of the polyominoes, was based only on empirical evidence. In contrast, our work provides a rigorous proof. Our result is based primarily on an analysis of the number of strings in a certain class that plays a significant role in the algorithm; it turns out that this number is closely related to the Motzkin numbers. We provide a rigorous computation that roughly confirms Jensen's estimate. We obtain the bound O(n^{5/2}·(√3)^n) on the running time of the algorithm, while the actual number of polyominoes is about C·4.06^n/n for some constant C > 0.
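
    Motzkin numbers grow roughly like 3^k/k^{3/2}, so signatures over a boundary of width about n/2 account for an exponential factor near (√3)^n, still far below the roughly 4.06^n polyominoes being counted. A small Python sketch (illustrative only, not the paper's analysis) comparing these growth rates:

```python
# Illustrative sketch only (not the paper's computation): compare the growth of
# Motzkin numbers, which govern the number of boundary signatures in Jensen's
# algorithm, with the roughly 4.06^n growth of the polyomino counts themselves.

def motzkin(limit):
    """Motzkin numbers M_0..M_limit via the recurrence
    M_{k+1} = M_k + sum_{i=0}^{k-1} M_i * M_{k-1-i}."""
    m = [1, 1]
    for k in range(1, limit):
        m.append(m[k] + sum(m[i] * m[k - 1 - i] for i in range(k)))
    return m

n = 40                                 # polyomino size
m = motzkin(n)
print(f"Motzkin({n // 2}) ~ {float(m[n // 2]):.3e}")   # signatures for width ~ n/2
print(f"(sqrt(3))^{n} ~ {3 ** (n / 2):.3e}")           # exponential factor in the bound
print(f"4.06^{n}/{n}  ~ {4.06 ** n / n:.3e}")          # approximate number of polyominoes
```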

    Enhancing GDPR compliance through data sensitivity and data hiding tools

    Since the emergence of the GDPR, several industries and sectors have been developing informatics solutions to comply with its rules. The health sector is considered a critical sector within Industry 4.0 because it manages sensitive data, and national health services are responsible for managing patients' data. European national health services are converging towards a connected system that allows the exchange of sensitive information across different countries. This paper defines and implements a set of tools for extending the reference architectural model for Industry 4.0 to the healthcare sector, which are used to enhance GDPR compliance. These tools deal with data sensitivity and data hiding. A case study illustrates the use of these tools and how they are integrated with the reference architectural model.
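
    The abstract does not detail how the data-hiding tools work; as a purely illustrative sketch (the field names, salt handling, and policy below are invented, not taken from the paper), one common building block is field-level pseudonymization of patient identifiers:

```python
# Hypothetical sketch of a field-level data-hiding step; the field names,
# salt handling, and policy are illustrative, not the paper's tools.
import hashlib

SENSITIVE_FIELDS = {"patient_id", "name", "national_id"}  # assumed policy

def pseudonymize(record: dict, salt: str) -> dict:
    """Replace sensitive fields with salted hashes so records can be linked
    across systems without exposing the original identifiers."""
    hidden = {}
    for key, value in record.items():
        if key in SENSITIVE_FIELDS:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            hidden[key] = digest[:16]
        else:
            hidden[key] = value
    return hidden

print(pseudonymize({"patient_id": "PT-001", "diagnosis": "J45"}, salt="s3cr3t"))
```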

    Hunting Trojan Horses

    In this report we present HTH (Hunting Trojan Horses), a security framework for detecting Trojan Horses and Backdoors. The framework is composed of two main parts: 1) Harrier, an application security monitor that performs run-time monitoring to dynamically collect execution-related data, and 2) Secpert, a security-specific expert system based on CLIPS, which analyzes the events collected by Harrier. Our main contributions to security research are three-fold. First, we identify common malicious behaviors, patterns, and characteristics of Trojan Horses and Backdoors. Second, we develop a security policy that can identify such malicious behavior, opening the door to effectively using expert systems to implement complex security policies. Third, we construct a prototype that successfully detects Trojan Horses and Backdoors.
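
    HTH's actual event schema and CLIPS rules are not given in this abstract; the toy sketch below (all process and event names invented) only illustrates the general pattern of a monitor feeding run-time events to a rule that flags a suspicious combination of behaviors:

```python
# Toy illustration (not HTH's actual policy or event schema): flag a process
# that both listens on a network socket and installs itself for autostart.
from collections import defaultdict

# Hypothetical event stream as (process, action) pairs collected at run time.
events = [
    ("updater.exe", "listen_socket"),
    ("notepad.exe", "write_file"),
    ("updater.exe", "write_autostart_key"),
]

seen = defaultdict(set)
for process, action in events:
    seen[process].add(action)
    # A simple rule in the spirit of an expert-system security policy:
    if {"listen_socket", "write_autostart_key"} <= seen[process]:
        print(f"ALERT: {process} looks like a backdoor (listens and persists)")
```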

    Towards Trustworthy Virtualisation Environments: Xen Library OS Security Service Infrastructure

    Keywords: trusted computing, virtualisation, Xen hypervisor.
    New cost-effective commodity PC hardware now includes fully virtualisable processors and the Trusted Computing Group's trusted platform module (TPM). This provides the opportunity to combine virtualisation, trusted computing, and open-source software development to tackle the security challenges modern computing faces. We believe that leveraging this technology to partition critical operating system services and applications into small modules with strictly controlled interactions is a good way to improve trustworthiness. To support the development of small applications running in Xen domains we built a library OS. We ported the GNU cross-development tool chain and standard C libraries to the small operating system kernel included with the Xen distribution, and wrote an inter-domain communication (IDC) library for communication between Xen domains. To confirm the usability of our library OS we ported a software TPM to run on it as a typical application. We evaluated the performance of our IDC system and showed that it performs well for the applications we envisage. We have shown that a lightweight library OS offers a convenient and practical way of reducing the trusted computing base of applications by running security-sensitive components in separate Xen domains.
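
    The IDC interface itself is not shown here; as a loose analogy only (ordinary Python processes and pipes, not Xen domains or the authors' library), the sketch below isolates a security-sensitive component behind a narrow message channel, which is the structural idea the library OS supports:

```python
# Loose analogy (not the Xen IDC API): run a security-sensitive component in a
# separate process and communicate over a message channel, mimicking the idea
# of placing it in its own domain behind a narrow, controlled interface.
from multiprocessing import Process, Pipe

def sealing_service(conn):
    """Stand-in for a service such as a software TPM running in its own domain."""
    secret_store = {}
    while True:
        msg = conn.recv()
        if msg[0] == "seal":
            _, key, value = msg
            secret_store[key] = value
            conn.send(("ok", key))
        elif msg[0] == "quit":
            break

if __name__ == "__main__":
    parent, child = Pipe()
    p = Process(target=sealing_service, args=(child,))
    p.start()
    parent.send(("seal", "disk-key", b"0123456789abcdef"))
    print(parent.recv())          # ('ok', 'disk-key')
    parent.send(("quit",))
    p.join()
```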

    Selecting long atomic traces for high coverage

    This paper performs a comprehensive investigation of dynamic selection for long atomic traces. It introduces a classification of trace selection methods and discusses existing and novel dynamic selection approaches, including loop unrolling, procedure inlining, and incremental merging of traces based on dynamic bias. The paper empirically analyzes a number of selection schemes in an idealized framework. Observations based on the SPEC-CPU2000 benchmarks show that: (a) selection based on dynamic bias is necessary to achieve the best performance across all benchmarks, (b) the best selection scheme is specific to the benchmark and to the maximum trace length, and (c) simple selection, based on program structure information only, is sufficient to achieve the best performance for several benchmarks. Consequently, two alternatives for the trace selection mechanism are established: (a) a "best performance" approach relying on complex dynamic criteria, and (b) a "value" approach that provides the best performance (and potentially the best power consumption) based on simpler static criteria. Another emerging alternative advocates adaptive mechanisms that adjust the selection criteria.
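
    As a purely illustrative sketch of the bias-based idea (the thresholds and branch profile below are invented, not the paper's benchmark data), a trace can be grown along each block's dominant successor and cut when a branch is no longer strongly biased or a length cap is reached:

```python
# Toy sketch of bias-driven trace selection (thresholds and the branch profile
# are illustrative, not taken from the paper).
BIAS_THRESHOLD = 0.9   # stop growing when a branch is less biased than this
MAX_TRACE_LEN = 6      # maximum trace length in basic blocks

# Hypothetical profile: block -> (most frequent successor, its dynamic bias).
profile = {
    "A": ("B", 0.98),
    "B": ("C", 0.95),
    "C": ("D", 0.60),   # weakly biased branch: trace growth stops here
    "D": ("A", 0.99),
}

def select_trace(start_block):
    """Grow a trace along strongly biased branches, up to MAX_TRACE_LEN blocks."""
    trace = [start_block]
    block = start_block
    while len(trace) < MAX_TRACE_LEN:
        successor, bias = profile[block]
        if bias < BIAS_THRESHOLD:
            break                      # dynamic bias too low to extend the trace
        trace.append(successor)
        block = successor
    return trace

print(select_trace("A"))   # ['A', 'B', 'C']
```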

    Counting polyominoes on twisted cylinders

    Using numerical methods, we analyze the growth in the number of polyominoes on a twisted cylinder as the number of cells increases. These polyominoes are related to classical polyominoes (connected subsets of a square grid) that lie in the plane. We thus obtain improved lower bounds on the growth rate of the number of classical polyominoes, which is also known as Klarner's constant.
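
    The twisted-cylinder transfer matrix is not reproduced here, but the quantity being bounded is easy to probe at small sizes: a brute-force count of fixed polyominoes in the plane (illustrative sketch, unrelated to the authors' numerical method) gives consecutive counts whose ratios slowly approach Klarner's constant of about 4.06.

```python
# Brute-force count of fixed polyominoes (translations identified, rotations
# and reflections counted as distinct), small n only; the counts grow like
# C * lambda^n / n with lambda ~ 4.06 (Klarner's constant).

def normalize(cells):
    """Translate a set of (x, y) cells so its bounding box starts at (0, 0)."""
    min_x = min(x for x, _ in cells)
    min_y = min(y for _, y in cells)
    return frozenset((x - min_x, y - min_y) for x, y in cells)

def grow(polyominoes):
    """All fixed polyominoes obtained by attaching one cell to a given set."""
    bigger = set()
    for poly in polyominoes:
        for x, y in poly:
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if (nx, ny) not in poly:
                    bigger.add(normalize(poly | {(nx, ny)}))
    return bigger

current = {frozenset({(0, 0)})}
for n in range(1, 9):
    print(n, len(current))   # 1, 2, 6, 19, 63, 216, 760, 2725 (OEIS A001168)
    current = grow(current)
```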